
Support Consistency of Direct Sparse-Change Learning in Markov Networks



Abstract

We study the problem of learning sparse structure changes between two Markov networks $P$ and $Q$. Rather than fitting two Markov networks separately to two sets of data and figuring out their differences, a recent work proposed to learn changes \emph{directly} via estimating the ratio between two Markov network models. In this paper, we give sufficient conditions for \emph{successful change detection} with respect to the sample sizes $n_p, n_q$, the dimension of data $m$, and the number of changed edges $d$. When using an unbounded density ratio model, we prove that the true sparse changes can be consistently identified for $n_p = \Omega(d^2 \log \frac{m^2+m}{2})$ and $n_q = \Omega(n_p^2)$, with an exponentially decaying upper-bound on learning error. Such sample complexity can be improved to $\min(n_p, n_q) = \Omega(d^2 \log \frac{m^2+m}{2})$ when the boundedness of the density ratio model is assumed. Our theoretical guarantee can be applied to a wide range of discrete/continuous Markov networks.
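The direct approach referenced in the abstract parameterizes the ratio $p(x)/q(x)$ with pairwise features and fits it under a sparsity penalty, so that nonzero parameters flag changed edges. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes Gaussian-style pairwise features $x_s x_t$, an $\ell_1$ penalty, and a plain proximal-gradient solver for a KLIEP-style density-ratio objective; the function names are hypothetical.

```python
import numpy as np

def pairwise_features(X):
    # f(x) = x_s * x_t for all s <= t: one feature per candidate pairwise
    # factor (including diagonal terms), i.e. (m^2 + m) / 2 features per sample.
    iu = np.triu_indices(X.shape[1])
    return np.einsum('ns,nt->nst', X, X)[:, iu[0], iu[1]]

def direct_change_estimate(Xp, Xq, lam=0.1, lr=0.05, n_iter=2000):
    # Fit theta in the ratio model p(x)/q(x) proportional to exp(theta^T f(x))
    # with an L1-penalized density-ratio (KLIEP-style) objective, solved by
    # proximal gradient descent; nonzero theta entries mark changed factors.
    Fp, Fq = pairwise_features(Xp), pairwise_features(Xq)
    mean_fp = Fp.mean(axis=0)
    theta = np.zeros(Fp.shape[1])
    for _ in range(n_iter):
        # Smooth loss: -E_P[theta^T f] + log E_Q[exp(theta^T f)].
        logits = Fq @ theta
        w = np.exp(logits - logits.max())
        w /= w.sum()                      # softmax weights over the Q samples
        grad = -mean_fp + Fq.T @ w        # gradient of the smooth part
        theta = theta - lr * grad
        # Soft-thresholding step for the L1 penalty (encourages sparse changes).
        theta = np.sign(theta) * np.maximum(np.abs(theta) - lr * lam, 0.0)
    return theta

# Toy usage: two Gaussian Markov networks whose precision matrices differ
# in a single edge; large-magnitude theta entries should align with it.
rng = np.random.default_rng(0)
m = 5
Kq = np.eye(m)
Kp = Kq.copy(); Kp[0, 1] = Kp[1, 0] = 0.4
Xp = rng.multivariate_normal(np.zeros(m), np.linalg.inv(Kp), size=2000)
Xq = rng.multivariate_normal(np.zeros(m), np.linalg.inv(Kq), size=2000)
theta = direct_change_estimate(Xp, Xq, lam=0.05)
```

The sketch only illustrates the estimator; the paper's contribution is the support-consistency analysis, i.e. the sample-size conditions under which the recovered nonzero pattern matches the true set of changed edges.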

